
    Combinatorial Information Theory: I. Philosophical Basis of Cross-Entropy and Entropy

    This study critically analyses the information-theoretic, axiomatic and combinatorial philosophical bases of the entropy and cross-entropy concepts. The combinatorial basis is shown to be the most fundamental (most primitive) of these three bases, since it gives (i) a derivation of the Kullback-Leibler cross-entropy and Shannon entropy functions, as simplified forms of the multinomial distribution subject to the Stirling approximation; (ii) an explanation for the need to maximize entropy (or minimize cross-entropy) to find the most probable realization; and (iii) new, generalized definitions of entropy and cross-entropy - supersets of the Boltzmann principle - applicable to non-multinomial systems. The combinatorial basis is therefore of much broader scope, with far greater power of application, than the information-theoretic and axiomatic bases. The generalized definitions underpin a new discipline of "combinatorial information theory", for the analysis of probabilistic systems of any type. Jaynes' generic formulation of statistical mechanics for multinomial systems is re-examined in light of the combinatorial approach. (abbreviated abstract)
    Comment: 45 pp; 1 figure; REVTeX; updated version 5 (incremental changes)
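As a rough sketch of the derivation claimed in (i), under the usual multinomial setup (notation here is generic and not necessarily the paper's own): the probability of a realization with occupancies n_i = N p_i drawn from source probabilities q_i reduces, under Stirling's approximation, to the Kullback-Leibler cross-entropy, and the statistical weight to the Shannon entropy.

```latex
% Generic sketch (standard notation, assumed rather than taken from the paper):
% multinomial probability of a realization with occupancies n_i = N p_i drawn
% from source probabilities q_i over s categories, and its statistical weight W.
\begin{align}
  \mathbb{P} &= N! \prod_{i=1}^{s} \frac{q_i^{\,n_i}}{n_i!}, \\
  \frac{1}{N}\ln \mathbb{P}
    &\approx -\sum_{i=1}^{s} p_i \ln \frac{p_i}{q_i} = -D_{KL}(p \,\|\, q)
    && \text{(Stirling: } \ln n! \approx n \ln n - n\text{)}, \\
  \frac{1}{N}\ln \mathbb{W}
    &\approx -\sum_{i=1}^{s} p_i \ln p_i = H_{Sh}
    && \text{(weight } \mathbb{W} = N!/\textstyle\prod_i n_i!\text{)}.
\end{align}
% Maximizing ln P (MaxProb) is thus asymptotically equivalent to minimizing
% D_{KL}, and for equiprobable q_i to maximizing the Shannon entropy, which is
% the sense of point (ii) above.
```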

    Combinatorial Entropies and Statistics

    We examine the combinatorial or probabilistic definition ("Boltzmann's principle") of the entropy or cross-entropy function $H \propto \ln \mathbb{W}$ or $D \propto -\ln \mathbb{P}$, where $\mathbb{W}$ is the statistical weight and $\mathbb{P}$ the probability of a given realization of a system. Extremisation of $H$ or $D$, subject to any constraints, thus selects the "most probable" (MaxProb) realization. If the system is multinomial, $D$ converges asymptotically (as the number of entities $N \to \infty$) to the Kullback-Leibler cross-entropy $D_{KL}$; for equiprobable categories in a system, $H$ converges to the Shannon entropy $H_{Sh}$. However, in many cases $\mathbb{W}$ or $\mathbb{P}$ is not multinomial and/or does not satisfy an asymptotic limit. Such systems cannot meaningfully be analysed with $D_{KL}$ or $H_{Sh}$, but can be analysed directly by MaxProb. This study reviews several examples, including (a) non-asymptotic systems; (b) systems with indistinguishable entities (quantum statistics); (c) systems with indistinguishable categories; (d) systems represented by urn models, such as "neither independent nor identically distributed" (ninid) sampling; and (e) systems representable in graphical form, such as decision trees and networks. Boltzmann's combinatorial definition of entropy is shown to be of greater importance for "probabilistic inference" than the axiomatic definition used in information theory.
    Comment: Invited contribution to the SigmaPhi 2008 Conference; accepted by EPJB volume 69 issue 3 June 200
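The asymptotic convergence claimed above can be checked numerically. The following is a minimal sketch; the category probabilities and sample sizes are arbitrary illustrative choices, not values taken from the paper.

```python
import numpy as np
from scipy.stats import multinomial
from scipy.special import rel_entr

# Illustrative check that -ln(P)/N approaches D_KL(p || q) as N grows
# (multinomial case). All numbers below are invented for the example.
q = np.array([0.5, 0.3, 0.2])   # source ("prior") probabilities of each category
p = np.array([0.3, 0.3, 0.4])   # observed frequencies of the realization
d_kl = rel_entr(p, q).sum()     # Kullback-Leibler cross-entropy D_KL(p || q)

for N in (10, 100, 1000, 10000):
    n = np.round(N * p).astype(int)      # occupancies n_i ~= N p_i
    n[-1] = N - n[:-1].sum()             # force the counts to sum exactly to N
    log_P = multinomial.logpmf(n, N, q)  # exact ln P of this realization
    print(f"N={N:6d}  -ln(P)/N = {-log_P / N:.4f}   D_KL = {d_kl:.4f}")

# -ln(P)/N converges to D_KL, i.e. in the multinomial, large-N limit the
# MaxProb realization coincides with the minimum cross-entropy one.
```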

    Jaynes' MaxEnt, Steady State Flow Systems and the Maximum Entropy Production Principle

    Jaynes' maximum entropy (MaxEnt) principle was recently used to give a conditional, local derivation of the "maximum entropy production" (MEP) principle, which states that a flow system with fixed flow(s) or gradient(s) will converge to a steady state of maximum production of thermodynamic entropy (R.K. Niven, Phys. Rev. E, in press). The analysis provides a steady-state analog of the MaxEnt formulation of equilibrium thermodynamics, applicable to many complex flow systems at steady state. The present study examines the classification of physical systems, with emphasis on the choice of constraints in MaxEnt. The discussion clarifies the distinction between equilibrium, fluid flow, source/sink, flow/reactive and other systems, leading into an appraisal of the application of MaxEnt to steady state flow and reactive systems.
    Comment: 6 pages; paper for MaxEnt0
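For readers unfamiliar with the MaxEnt machinery being referred to, the following is a minimal, generic Jaynes MaxEnt sketch: maximize the Shannon entropy subject to normalization and one mean-value constraint, giving the canonical form p_i ∝ exp(-λ f_i). The "flux levels" and target mean are invented for illustration; this is not the paper's MEP derivation or its choice of constraints.

```python
import numpy as np
from scipy.optimize import brentq

# Generic Jaynes MaxEnt sketch (assumed, illustrative setup): maximize
# -sum_i p_i ln p_i subject to sum_i p_i = 1 and a prescribed mean <f>.
f = np.array([0.0, 1.0, 2.0, 3.0, 4.0])  # hypothetical values of the constrained quantity
f_mean = 1.5                             # prescribed mean (the MaxEnt constraint)

def mean_given_lam(lam):
    """Mean of f under the canonical distribution p_i ∝ exp(-lam * f_i)."""
    w = np.exp(-lam * f)
    p = w / w.sum()
    return p @ f

# Solve <f>(lam) = f_mean for the Lagrange multiplier lam by bracketing.
lam = brentq(lambda x: mean_given_lam(x) - f_mean, -50.0, 50.0)
p = np.exp(-lam * f)
p /= p.sum()
print("lambda =", lam)
print("MaxEnt distribution:", p, " mean =", p @ f)
```

The paper's point is that for flow and reactive systems the choice of the constrained quantities (flows, gradients, sources/sinks) is the substantive modelling decision; the maximization step itself has the same structure as above.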